# 2.7 billion parameters
GPT-Neo 2.7B
License: MIT
GPT-Neo 2.7B is a 2.7-billion-parameter Transformer language model built by EleutherAI as a replication of the GPT-3 architecture and trained on the Pile dataset.
Tags: Large Language Model, English
Publisher: EleutherAI · 52.68k · 486
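
For reference, EleutherAI publishes this model as the Hugging Face checkpoint `EleutherAI/gpt-neo-2.7B`, loadable with the Transformers library. The sketch below is a minimal example, not taken from this page; the prompt and sampling settings are illustrative assumptions.

```python
# Minimal sketch: loading GPT-Neo 2.7B with Hugging Face Transformers.
# The checkpoint ID is EleutherAI's standard public release; prompt and
# sampling settings here are assumptions for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/gpt-neo-2.7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # ~10 GB of fp32 weights

prompt = "GPT-Neo 2.7B is a language model that"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-Neo has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```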